Position: Judicial Technician - Information Technology
Year: 2010
Note: Questions 53 through 60 refer to the text below.
Data mining
From Wikipedia, the free encyclopedia
Not to be confused with information extraction.
Data mining is the process of extracting patterns from data. Data mining is seen as an increasingly important tool by modern business to transform data into an informational advantage. It is currently used in a wide range of profiling practices, such as marketing, surveillance, fraud detection, and scientific discovery.
The related terms data dredging, data fishing and data snooping refer to the use of data mining techniques on sample portions of the larger population data set that are (or may be) too small for reliable statistical inferences to be made about the validity of any patterns discovered (see also data-snooping bias). These techniques can, however, be used in the creation of new hypotheses to test against the larger data populations.
Background
The manual extraction of patterns from data has occurred for centuries. Early methods of identifying patterns in data include Bayes' theorem (1700s) and regression analysis (1800s). The proliferation, ubiquity and increasing power of computer technology has increased data collection and storage. As data sets have grown in size and complexity, direct hands-on data analysis has increasingly been augmented with indirect, automatic data processing. This has been aided by other discoveries in computer science, such as neural networks, clustering, genetic algorithms (1950s), decision trees (1960s) and support vector machines (1980s). Data mining is the process of applying these methods to data with the intention of uncovering hidden patterns. It has been used for many years by businesses, scientists and governments to sift through volumes of data such as airline passenger trip records, census data and supermarket scanner data to produce market research reports. (Note, however, that reporting is not always considered to be data mining.)
A primary reason for using data mining is to assist in the analysis of collections of observations of behaviour. Such data are vulnerable to collinearity because of unknown interrelations. An unavoidable fact of data mining is that the (sub-)set(s) of data being analysed may not be representative of the whole domain, and [CONJUNCTION] may not contain examples of certain critical relationships and behaviours that exist across other parts of the domain. To address this sort of issue, the analysis may be augmented using experiment-based and other approaches, such as Choice Modelling for human-generated data. In these situations, inherent correlations can be either controlled for, or removed altogether, during the construction of the experimental design.
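The data-snooping risk the passage describes — patterns found in small subsamples that vanish in the full population — can be illustrated with a short simulation. This is a hypothetical sketch, not part of the exam text: two variables are generated independently, so any correlation a small subsample shows is pure noise.

```python
import random

def pearson(xs, ys):
    # Plain Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
# Two truly independent variables: any pattern between them is noise.
population_x = [random.gauss(0, 1) for _ in range(100_000)]
population_y = [random.gauss(0, 1) for _ in range(100_000)]

# "Data dredging": scan thousands of tiny subsamples and keep the
# strongest correlation found in any of them.
best = max(
    abs(pearson(population_x[i:i + 10], population_y[i:i + 10]))
    for i in range(0, 100_000, 10)
)
whole = abs(pearson(population_x, population_y))

print(f"strongest correlation in any 10-point subsample: {best:.2f}")
print(f"correlation in the full population:              {whole:.3f}")
```

The strongest subsample correlation looks impressive, while the whole-population correlation is essentially zero — exactly the inference failure the passage warns about when subsamples are too small.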
Position: Judicial Technician - Information Technology
Year: 2013
Note: To answer questions 48 through 50, consider the text below.
Software Evaluation: Criteria-based Assessment
Mike Jackson, Steve Crouch and Rob Baxter
Criteria-based assessment is a quantitative assessment of the software in terms of sustainability, maintainability, and usability. This can inform high-level decisions on specific areas for software improvement.
Open Source Initiative
A criteria-based assessment gives a measurement of quality in a number of areas. These areas are derived from ISO/IEC 9126-1 Software engineering − Product quality and include usability, sustainability and maintainability.
The assessment involves checking whether the software, and the project that develops it, conforms to various characteristics or exhibits various qualities that are expected of sustainable software. The more characteristics that are satisfied, the more sustainable the software. Please note that not all qualities have equal weight e.g. having an OSI-approved open source licence is of more importance than avoiding TAB characters in text files.
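The unequal weighting described above can be sketched as a tiny scoring routine. The criteria names and weight values here are hypothetical illustrations, not figures from the assessment itself:

```python
# Hypothetical criteria-based assessment: each criterion carries a weight,
# reflecting that not all qualities have equal importance.
criteria = {
    "osi_approved_licence": 10,  # weighted heavily, as the text suggests
    "no_tab_characters":     1,  # weighted lightly
    "user_documentation":    5,
    "portable_build":        5,
}

def score(satisfied):
    """Return the weighted percentage of criteria the software satisfies."""
    total = sum(criteria.values())
    achieved = sum(w for name, w in criteria.items() if name in satisfied)
    return 100 * achieved / total

# A project with a licence and documentation scores far better than one
# that merely avoids TAB characters, despite satisfying fewer criteria counts.
print(score({"osi_approved_licence", "user_documentation"}))
print(score({"no_tab_characters"}))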
In performing the evaluation, you may want to consider how different user classes affect the importance of the criteria. For example, for Usability-Understandability, a small set of well-defined, accurate, task-oriented user documentation may be comprehensive for Users but inadequate for Developers. Assessments specific to user classes allow the requirements of these specific user classes to be factored in and so, for example, show that a project rates highly for Users but poorly for Developers, or vice versa.
Scoring can also be affected by the nature of the software itself e.g. for ...A... one could envisage an application that has been well-designed, offers context-sensitive help etc. and consequently is so easy to use that tutorials aren’t needed. Portability can apply to both the software and its development infrastructure e.g. the open source software OGSA-DAI2 can be built, compiled and tested on Unix, Windows or Linux (and so is highly portable for Users and User-Developers). However, its Ruby test framework cannot yet run on Windows, so running integration tests would involve the manual setup of OGSA-DAI servers (so this is far less portable for Developers and, especially, Members).
(Adapted from: http://africanpot.org/index.php/resource-center/resource-library/func-startdown/27/)
Position: Judicial Technician - Information Technology
Year: 2012
Note: Consider the text below to answer questions 56 through 60.
London becomes 4G high speed internet hotspot
London will begin to switch on 4G high-speed mobile internet with the launch of the first large-scale public trial in Britain. Initiated by O2, Britain's second largest operator with 22 million customers, the trial involves more than 25 masts covering 15 square miles. It will run for nine months, and the equipment installed will eventually become part of O2's first commercial 4G network.
The technology is 10 times faster at navigating the internet than the current 3G networks, which often frustrate smartphone users because they are significantly slower than the average home broadband connection. The 25 masts in London will be able to carry more data than O2's entire national 3G network.
Britain's 4G or long-term evolution (LTE) upgrade, expected to begin in earnest in 2013 after a much delayed spectrum auction, will make mobile networks powerful enough to handle video calls, high definition TV and live multi-player gaming. About 1,000 users will be invited to join the London trial.
Initially, the O2 trial will not involve phones, because no compatible handsets exist yet. Samsung dongles will be handed out to plug into tablets and laptop computers, as will portable miniature modems that can create small Wi-Fi hotspots linking into O2's 4G infrastructure or "backhaul".
The new technology is capable of speeds of up to 150 megabits per second. During the trial, users will be more likely to experience average speeds between 25Mbps and 50Mbps. When 4G is introduced nationally the average speeds are likely to drop to between 10Mbps and 15Mbps. This is faster than 3G, which averages between 1Mbps and 1.5Mbps, and compares well with the average household, fixed line broadband connection, which rose to just under 7Mbps this year.
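The speed figures quoted above can be put into perspective with a quick calculation of download times. The 100 MB file size is an arbitrary choice for illustration; the speeds are those named in the passage:

```python
# Time to download a 100 MB file at the speeds quoted in the text.
FILE_MEGABITS = 100 * 8  # 100 megabytes expressed in megabits

speeds_mbps = {
    "3G average (1.5 Mbps)":         1.5,
    "fixed broadband (7 Mbps)":      7.0,
    "national 4G average (15 Mbps)": 15.0,
    "London trial (50 Mbps)":        50.0,
}

for label, mbps in speeds_mbps.items():
    seconds = FILE_MEGABITS / mbps
    print(f"{label:32s} {seconds:7.1f} s")
```

At the trial's 50 Mbps the download takes 16 seconds; at the 3G average of 1.5 Mbps it takes nearly nine minutes, which is the gap the article is describing.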
Live gaming against other players and video calling without delays will become possible from phones, because the speed at which new information loads onto the screen will be reduced from 1 second to 0.07 seconds.
(Adapted from www.guardian.co.uk, Sunday 13 November 2011)
Without changing the meaning, the expression “in earnest” in line 10 may be replaced by:
Position: Judicial Technician - Information Technology
Year: 2013
Note: Consider the following text to answer questions 56 through 60.
Google Maps returns to the iPhone
By Hayley Tsukayama, Published: December 13, 2012
Google announced late Wednesday that its Maps app is back for the iPhone.
Google Maps had been the default navigation service on the iPhone since the phone was first released in 2007. But earlier this year, Apple kicked the app off its mobile device and replaced it with its own mapping program, Maps. Many iPhone users balked at the notoriously inaccurate Apple navigation program and the loss of Google Maps, saying they would wait for Google to resubmit its app to the Apple store.
The addition, however, doesn’t let Apple off the hook for improving its own Maps, which is playing catch-up with Google as it tries to gather the amount and quality of navigation data it needs to bridge the gap between the two apps. Google Maps is known for being more comprehensive and generally more accurate than the other navigation apps out there.
Google’s new iPhone mapping app adds features from Google’s Android that some Apple users have been longing for, such as turn-by-turn directions for walking, mass transit and driving, and a smooth integration of Google’s Street View technology. Navigation on the app is quick and intuitive.
Google Maps also pulls transit information from several sites to make it easier to plan trips. In Washington, users can choose ..I.. they want to go by bus, Metro or some combination of both and get clear route instructions with station names and clearly labeled bus stops.
In this latest version, Google also makes it easier for users to flag the service if it gives the wrong directions. A shake of the iPhone triggers a dialogue box that offers the option to report errors or bugs directly to Google. There is at least one feature from Google Maps for Android that’s missing from the iPhone: the ability to save a portion of a map for offline use, which comes in handy when the phone’s battery runs low.
(Adapted from http://www.washingtonpost.com/business/technology/google-maps-returns-to-the-iphone/2012/12/13/80b92822-4520-11e2-8061-253bccfc7532_story.html)